Curiosity-driven Exploration for Mapless Navigation with Deep Reinforcement Learning
Authors
Abstract
This paper investigates exploration strategies for learning navigation policies for mobile robots with Deep Reinforcement Learning (DRL). In particular, we augment the usual external reward used to train DRL algorithms with an intrinsic reward signal based on curiosity. We test our approach in a mapless navigation setting, where the autonomous agent must navigate, without an occupancy map of the environment, to targets whose relative locations can be acquired through low-cost solutions (e.g., visible light localization or Wi-Fi signal localization). We validate that intrinsic motivation is crucial for improving DRL performance in tasks with challenging exploration requirements. Our experimental results show that the proposed method learns navigation policies more effectively and generalizes better to previously unseen environments. A video of our experimental results can be found at https://goo.gl/pWbpcF.
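As a rough illustration of the reward shaping described in the abstract, the sketch below adds an ICM-style forward-model prediction error (curiosity) to the external navigation reward. This is a minimal sketch in PyTorch under assumed details: the network sizes, the observation encoding, and the weighting factor beta are illustrative assumptions, not values taken from the paper.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CuriosityModule(nn.Module):
    """ICM-style forward model: curiosity = prediction error in feature space."""
    def __init__(self, obs_dim, act_dim, feat_dim=64):
        super().__init__()
        # Encode raw observations (e.g., laser scan plus relative goal position).
        self.encoder = nn.Sequential(
            nn.Linear(obs_dim, 128), nn.ReLU(), nn.Linear(128, feat_dim))
        # Predict the next feature vector from the current feature and the action.
        self.forward_model = nn.Sequential(
            nn.Linear(feat_dim + act_dim, 128), nn.ReLU(), nn.Linear(128, feat_dim))

    def intrinsic_reward(self, obs, action, next_obs):
        phi, phi_next = self.encoder(obs), self.encoder(next_obs)
        phi_pred = self.forward_model(torch.cat([phi, action], dim=-1))
        # Larger prediction error means a less familiar transition, hence a higher bonus.
        return 0.5 * F.mse_loss(phi_pred, phi_next.detach(), reduction="none").mean(-1)

def augmented_reward(r_external, obs, action, next_obs, curiosity, beta=0.2):
    # Total reward fed to the DRL algorithm: external reward plus weighted curiosity bonus.
    with torch.no_grad():
        r_intrinsic = curiosity.intrinsic_reward(obs, action, next_obs)
    return r_external + beta * r_intrinsic

In such a setup the curiosity module is typically trained jointly with the policy by minimizing the same forward-model prediction error on observed transitions, so the bonus decays for transitions the agent has already learned to predict.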
Similar resources
Learning tactile skills through curious exploration
We present curiosity-driven, autonomous acquisition of tactile exploratory skills on a biomimetic robot finger equipped with an array of microelectromechanical touch sensors. Instead of building tailored algorithms for solving a specific tactile task, we employ a more general curiosity-driven reinforcement learning approach that autonomously learns a set of motor skills in absence of an explici...
Curiosity-Driven Exploration with Planning Trajectories
Reinforcement learning (RL) agents can reduce learning time dramatically by planning with learned predictive models. Such planning agents learn to improve their actions using planning trajectories, sequences of imagined interactions with the environment. However, planning agents are not intrinsically driven to improve their predictive models, which is a necessity in complex environments. This p...
Computational Theories of Curiosity-Driven Learning
What are the functions of curiosity? What are the mechanisms of curiosity-driven learning? We approach these questions using concepts and tools from machine learning and developmental robotics. We argue that curiosity-driven learning enables organisms to make discoveries to solve complex problems with rare or deceptive rewards. By fostering exploration and discovery of a diversity of behavioura...
Intrinsically motivated oculomotor exploration guided by uncertainty reduction and conditioned reinforcement in non-human primates
Intelligent animals have a high degree of curiosity--the intrinsic desire to know--but the mechanisms of curiosity are poorly understood. A key open question pertains to the internal valuation systems that drive curiosity. What are the cognitive and emotional factors that motivate animals to seek information when this is not reinforced by instrumental rewards? Using a novel oculomotor paradigm,...
2018-00413 - Post-doctoral - Unsupervised learning with deep nets for intrinsically motivated exploration of dynamical systems
The Flowers team studies computational mechanisms allowing robots and humans to acquire open-ended repertoires of skills through life-long learning. This includes the processes for progressively discovering their bodies and interacting with objects, tools and others. In particular, we study mechanisms of intrinsically motivated learning (also called curiosity-driven active learning), autonomous ...
Journal:
Volume, Issue:
Pages:
Publication date: 2018